Review of time complexity:
$f = O(g)$ (Big-O): $\exists k \in \R_{>0}\,(\exists n_0 \in \N\,(\forall n \in \N.\ (n > n_0 \implies f(n) \le kg(n))))$
We use Big-O to describe an upper bound on a function by some common function: for large enough inputs, $f(n)$ is at most $k$ times $g(n)$ for some constant $k$.
aka $f \le g$
$f = \Omega(g)$ (Big-Omega): $\exists k \in \R_{>0}\,(\exists n_0 \in \N\,(\forall n \in \N.\ (n > n_0 \implies f(n) \ge kg(n))))$
We use Big-Omega to describe a lower bound on a function by some common function: for large enough inputs, $f(n)$ is at least $k$ times $g(n)$ for some constant $k$.
$f \ge g$
$f = \Theta(g)$ (Big-Theta): $\exists k_1, k_2 \in \R_{>0}\,(\exists n_0 \in \N\,(\forall n \in \N.\ (n > n_0 \implies k_1 g(n) \le f(n) \le k_2 g(n))))$
For large enough inputs, $f(n)$ is bounded between $k_1 g(n)$ and $k_2 g(n)$ for some constants $k_1, k_2$.
$f = O(g) \land f = \Omega(g) \iff f = \Theta(g)$
$f \approx g$
$f = o(g)$ (Little-o): $\forall k \in \R_{>0}\,(\exists n_0 \in \N\,(\forall n \in \N.\ (n > n_0 \implies f(n) < kg(n))))$
$f = \omega(g)$ (Little-omega): $\forall k \in \R_{>0}\,(\exists n_0 \in \N\,(\forall n \in \N.\ (n > n_0 \implies f(n) > kg(n))))$
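As a quick check of these definitions (an illustrative example, not from the lecture): take $f(n) = 3n^2 + 5n$ and $g(n) = n^2$. For $n > 1$ we have $3n^2 + 5n \le 3n^2 + 5n^2 = 8n^2$, so $k = 8$, $n_0 = 1$ witnesses $f = O(n^2)$; and $3n^2 + 5n \ge 3n^2$, so $k = 3$ witnesses $f = \Omega(n^2)$. Together, $f = \Theta(n^2)$.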
$\log$ in time complexity analysis always stands for $\log_2$, and we always use $\varphi$ to denote the golden ratio.
$1 < \log(n) < n^{0.001} < n < n\log(n) < n^{1.001} < n^{1000} < 1.001^n < 2^n$ as $n \to \infty$
For recurrences, we can simply use $r^n$ as a guess to judge the time complexity. For example, the Fibonacci recurrence:
Assume the cost of $P(k+1)$ is $r^{k+1}$; since $P(k+1) = P(k) + P(k-1)$, the subproblems have costs $r^k$ and $r^{k-1}$.
Then we have $r^{k+1} \ge r^k + r^{k-1} \implies r^{k+1} - r^k - r^{k-1} \ge 0 \implies r^2 - r - 1 \ge 0$
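Carrying this one step further (spelled out here for completeness): for $r > 0$, $r^2 - r - 1 \ge 0$ holds exactly when $r \ge \frac{1 + \sqrt{5}}{2} = \varphi$, so the smallest admissible guess is $r = \varphi$ and the recursion is $O(\varphi^n)$; the reverse inequality $r^{k+1} \le r^k + r^{k-1}$ gives $r \le \varphi$, so the recursion is also $\Omega(\varphi^n)$, hence $\Theta(\varphi^n)$.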
To prove $T = \Theta(f)$ (a worked example follows these steps):
First prove $T = O(f)$:
Remove all the floors and ceilings from the recurrence $T$
Make a guess for $f$ such that $T(n) = O(f(n))$
Write out the recurrence: $T(n) = \ldots$
Whenever $T(k)$ appears on the RHS of the recurrence, substitute it with $cf(k)$
Try to prove $T(n) \le cf(n)$
Pick $c$ to make your analysis work
Then prove $T = \Omega(f)$ the same way
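A minimal worked example of these steps (the recurrence is my own illustration, not from the lecture): take $T(n) = 2T(n/2) + n$ and guess $f(n) = n\log n$. Substituting $cf(k)$ for $T(k)$ on the RHS:
$$T(n) \le 2 \cdot c\,\tfrac{n}{2}\log\tfrac{n}{2} + n = cn\log n - cn + n \le cn\log n$$
as long as we pick $c \ge 1$ (using $\log = \log_2$, so $\log\tfrac{n}{2} = \log n - 1$). This proves $T(n) = O(n\log n)$; repeating the argument with $\ge$ and $c \le 1$ gives $\Omega(n\log n)$, so $T(n) = \Theta(n\log n)$.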
A recurrence is in standard form if it is written as $T(n) = aT(n/b) + f(n)$ for some constants $a \ge 1$, $b > 1$, and some function $f: \N \to \R$
$a$: the branching factor of the tree (how many children each node has)
$b$: the reduction factor (subproblem size relative to its parent)
$f(n)$: the cost of the non-recursive work
The height of the tree is $\log_b(n)$
The number of vertices at level $h$ is $a^h$
The total non-recursive work done at level $h$ is $a^h f(n/b^h)$
Root work: $f(n)$
Leaf work: $a^{\log_b(n)} f(1) = \Theta(n^{\log_b(a)})$
Summing up the levels, the total amount of work done is $\sum_{h=0}^{\log_b(n)} a^h f(n/b^h)$
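Two side notes, stated for completeness: the leaf-work identity uses $a^{\log_b(n)} = n^{\log_b(a)}$, which follows by taking $\log_b$ of both sides. And as a sanity check of the level sum, for $T(n) = 2T(n/2) + n$ each level does $a^h f(n/b^h) = 2^h \cdot \tfrac{n}{2^h} = n$ work over $\log_2(n) + 1$ levels, so the total is $\Theta(n\log n)$, matching the substitution proof above.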
We always use the Master Theorem to solve such standard-form recurrences. Let $T(n) = aT(n/b) + f(n)$; then there are three cases, determined by comparing $f(n)$ with the leaf work $n^{\log_b(a)}$:
Leaf heavy: $f(n) = O(n^{\log_b(a) - \epsilon})$ for some constant $\epsilon > 0$
Balanced: $f(n) = \Theta(n^{\log_b(a)})$
Root heavy: $f(n) = \Omega(n^{\log_b(a) + \epsilon})$ for some constant $\epsilon > 0$, and $af(n/b) \le cf(n)$ for some constant $c < 1$ for all sufficiently large $n$ (Regularity Condition)
Combining these cases, we have:
$$T(n) = \begin{cases} \Theta(n^{\log_b(a)}) & \text{Leaf heavy case} \\ \Theta(n^{\log_b(a)}\log(n)) & \text{Balanced case} \\ \Theta(f(n)) & \text{Root heavy case} \end{cases}$$
Based on this, we:
Write the recurrence in standard form to find the parameters $a, b, f$
Compare $n^{\log_b(a)}$ to $f$ to determine which case applies
Read off the asymptotics from the relevant case (worked examples follow)
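Two quick applications of this recipe (my own examples, not from the lecture):
Merge sort: $T(n) = 2T(n/2) + \Theta(n)$, so $a = 2$, $b = 2$, and $n^{\log_2(2)} = n = \Theta(f(n))$: balanced case, hence $T(n) = \Theta(n\log n)$.
Karatsuba multiplication: $T(n) = 3T(n/2) + \Theta(n)$, so $n^{\log_2(3)} \approx n^{1.585}$ and $f(n) = n = O(n^{\log_2(3) - \epsilon})$: leaf heavy case, hence $T(n) = \Theta(n^{\log_2(3)})$.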
Simplified Master Theorem: let $f = n^c$, then
$$T(n) = \begin{cases} \Theta(n^{\log_b(a)}) & a > b^c \\ \Theta(n^c \log(n)) & a = b^c \\ \Theta(n^c) & a < b^c \end{cases}$$
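For instance (an illustrative example): $T(n) = 4T(n/2) + n$ has $a = 4$, $b = 2$, $c = 1$, and $a > b^c$ since $4 > 2$, so $T(n) = \Theta(n^{\log_2(4)}) = \Theta(n^2)$.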